Health Science Reports
Wiley
Preprints posted in the last 90 days, ranked by how well they match the content profile of Health Science Reports, based on 12 papers previously published there. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit.
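The scoring method behind these rankings is not described here. Purely as a hypothetical illustration of one common approach, a preprint and a journal's content profile could each be represented as term-weight vectors and compared by cosine similarity (all names and weights below are invented):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two sparse term-weight vectors (dicts).

    Hypothetical illustration only: the actual match-score method used
    for these rankings is not disclosed.
    """
    dot = sum(w * b.get(term, 0.0) for term, w in a.items())
    norm_a = sqrt(sum(w * w for w in a.values()))
    norm_b = sqrt(sum(w * w for w in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# toy journal profile built from previously published papers
journal = {"kidney": 0.8, "hiv": 0.5, "dengue": 0.3}
preprint = {"kidney": 0.9, "dialysis": 0.4}
score = cosine_similarity(preprint, journal)
```

Preprints would then be ranked by descending score, with a score above the corpus average counting as an above-average fit.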
Mahesh, E.; Sourabha, S.; Yousuff, M.; R, R.; Gurudev, K.; MS, G.; Prabhu, P.
Background: Catheter-related bloodstream infection (CRBSI) is a major cause of morbidity and mortality among patients undergoing hemodialysis (HD), particularly in low- and middle-income settings where non-tunneled hemodialysis catheters (NTHC) are widely used. Local epidemiological data are essential to guide preventive and therapeutic strategies.
Objectives: To determine the prevalence, microbiological profile, antimicrobial resistance patterns, and clinical outcomes of CRBSI in patients undergoing HD via internal jugular NTHC at a tertiary care center in South India.
Methods: This retrospective observational study included adults initiated on HD using internal jugular NTHC between January 2017 and December 2023. Patients with pre-existing infections or catheters inserted elsewhere were excluded. CRBSI was defined using KDOQI criteria. Demographic, clinical, laboratory, microbiological, and outcome data were analyzed. Logistic regression identified risk factors, and receiver operating characteristic (ROC) analysis evaluated predictors of adverse outcomes.
Results: Among 396 patients (mean age 56.3 ± 14 years; 70.4% male), 65 (16.4%) developed CRBSI, with an incidence of 4.7 per 1000 catheter-days. Emergency HD initiation (OR 14.86, p < 0.001) and access failure (OR 2.71, p = 0.004) significantly increased CRBSI risk, while planned initiation for uremic symptoms was protective. Patients with CRBSI had lower serum albumin and higher leukocyte counts. Gram-negative organisms predominated (53.8%), with Klebsiella pneumoniae the most common isolate. High resistance was observed to β-lactam/β-lactamase inhibitor combinations and carbapenems. Gram-negative CRBSI was associated with significantly higher odds of hospitalization, ICU admission, inotropic support, and mortality. ROC analysis showed good predictive ability for adverse outcomes (AUC 0.73-0.77).
Conclusions: CRBSI remains a significant complication of NTHC-based HD. Predominant Gram-negative infections with high antimicrobial resistance are associated with worse clinical outcomes, underscoring the need for early permanent access creation, strict catheter care, and robust antibiotic stewardship.
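For readers unfamiliar with the units, the incidence figure above is events per 1000 catheter-days of observation. A minimal sketch of the calculation, noting that the total of roughly 13,800 catheter-days is inferred from the reported 65 cases and 4.7/1000 rate, not stated in the abstract:

```python
def incidence_per_1000_catheter_days(events, catheter_days):
    """Incidence rate: events per 1000 catheter-days of observation."""
    return 1000 * events / catheter_days

# 65 CRBSI episodes over ~13,830 catheter-days (inferred total)
rate = incidence_per_1000_catheter_days(65, 13830)  # ~4.7
```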
Diaz, M. M.; Enders, K.; Tovar-Ramirez, S.; Rodriguez-Angeles, Y.; Roldan, V.; Nolasco, M.; Zou, Y.; She, J.; Sotolongo, P.; Mejia, F.; Valcour, V.; Garcia, P. J.; Marquine, M. J.; Tsoy, E.
Introduction: Neurocognitive impairment (NCI) remains common among people living with HIV (PWH), particularly in low- and middle-income countries where accurate diagnostic tools are limited. In Peru, the lack of locally validated neuropsychological (NP) normative data in Spanish poses a major barrier to diagnosing HIV-associated NCI, especially among PWH who develop NCI at younger ages. This study aimed to develop regression-based NP norms for young and middle-aged Spanish-speaking adults in Lima, Peru, and to validate the norms in demographically similar PWH to improve the diagnostic precision of HIV-associated NCI.
Methods: A total of 164 healthy adults without HIV from Lima completed a comprehensive NP battery assessing memory, attention, executive function, and language, which are commonly affected in HIV-associated NCI. Multiple regression models were used to account for the influence of age, years of education, and sex on raw scores, yielding standardized, demographically adjusted norms for the population. The resulting norms were then applied to 310 PWH from Lima and compared with previously published norms for Spanish-speaking adults to evaluate performance differences.
Results: Age and education were the strongest predictors of performance across tests, while sex had minimal influence. Compared to people without HIV, PWH had significantly lower educational attainment (mean 12.6 vs. 13.7 years) and exhibited significantly worse performance on normed scores of Benson Figure Copy, Benson Figure Delayed Recall, Color Trails 1 and 2, Hopkins Verbal Learning Test - Revised, and WAIS-III Digit Symbol Coding, Digit Span, and Symbol Search. There were statistically significant differences in T-scores on nearly all tests between our population-specific norms and previously published norms, in both directions, indicating potential over- and under-detection errors when applying norms from non-local samples.
Discussion: Our findings highlight the utility of locally derived norms in detecting subtle cognitive changes among young and middle-aged PWH compared with previously published norms for Spanish speakers. Application of these norms reveals significant between-group differences that may go undetected using non-local normative data or raw scores. Future efforts should focus on rural norm development and inclusion of individuals with lower educational backgrounds in Peru and other Latin American countries.
Tefera, B.; Ali, R.; Megersa, B. S.; Girma, T.; Friis, H.; Abera, M.; Belachew, T.; Olsen, M. F.; Filteau, S.; Wells, J. C.; Wibaek, R.; Yilma, D.; Nitsch, D.
Introduction: Glomerular filtration rate (GFR) is invasive to measure. Therefore, in clinical care, estimated GFR is derived from serum levels of endogenous filtration markers such as creatinine and cystatin C. Multiple studies from high-income countries showed differences between estimated glomerular filtration rate based on cystatin C (eGFRcys) and creatinine (eGFRcr). This study aimed to assess the agreement between eGFRcys and eGFRcr in Ethiopian children and to identify factors associated with higher eGFRcys and eGFRcr.
Methods: We studied 350 Ethiopian children who were part of the iABC birth cohort study. At the most recent follow-up (average age 10 years), serum cystatin C and creatinine were measured. Formulas by Berg (2015) and Hoste (2014) were used to estimate eGFRcys and eGFRcr, respectively, and Bland-Altman plots assessed their agreement. The difference in eGFR (eGFRdiff) was calculated and categorized as less than -15 mL/min/1.73 m² (higher eGFRcr), between -15 and <15 mL/min/1.73 m² (concordant), and greater than or equal to 15 mL/min/1.73 m² (higher eGFRcys). Multinomial logistic regression was used to identify factors associated with higher eGFRcr and higher eGFRcys.
Results: Estimated glomerular filtration rate (eGFR) varied significantly with the estimation formula used. Using the Berg (2015) and Hoste (2014) formulas, the median (IQR) eGFRcys and eGFRcr were 99.4 (90.0; 114.1) and 123.2 (110.3; 143.8) mL/min/1.73 m², respectively. Overall, we observed poor agreement between eGFRcys and eGFRcr, with only 94 (27.6%) children having concordant results, compared to 220 (64.7%) with higher eGFRcr and 26 (7.6%) with higher eGFRcys. If the eGFRcys results are considered reliable, 27.5% of the children had eGFR below 90 mL/min/1.73 m².
Conclusion: There was very marked variation in the distributions of estimated eGFRs depending on which pediatric formulas were used. Agreement between eGFR estimated using cystatin C and creatinine was poor among Ethiopian children. Relative to eGFRcys, creatinine-based equations may overestimate kidney function by up to 30 mL/min/1.73 m² in Ethiopia. Ideally, a validation study with GFR measured by a gold-standard method (inulin clearance) in children is required; however, given the invasive nature and cost of that method, iohexol clearance studies are recommended instead.
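The eGFRdiff categorization described in the Methods above can be sketched directly. A minimal illustration using the abstract's ±15 mL/min/1.73 m² thresholds (the function name is ours, not the study's):

```python
def categorize_egfr_diff(egfr_cys, egfr_cr):
    """Classify agreement between cystatin C- and creatinine-based eGFR.

    eGFRdiff = eGFRcys - eGFRcr (mL/min/1.73 m^2), categorized as:
      < -15        -> "higher eGFRcr"
      -15 to < 15  -> "concordant"
      >= 15        -> "higher eGFRcys"
    """
    diff = egfr_cys - egfr_cr
    if diff < -15:
        return "higher eGFRcr"
    if diff >= 15:
        return "higher eGFRcys"
    return "concordant"

# the reported medians fall in the "higher eGFRcr" category
category = categorize_egfr_diff(99.4, 123.2)  # "higher eGFRcr"
```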
Forster, A.; Rehman, F.; Moist, L.; Holden, R.; Thomson, B. K.
Introduction: Catastrophic bleeding can be fatal in patients on hemodialysis using arteriovenous (AV) fistulas or grafts. Campaigns such as the UK's "Put a Lid On It" and Australia's "Stop the Bleed" have recommended the use of bleeding cessation devices, but evidence for their use remains limited. The recent creation of the bleeding cessation device "Kidney-CAP" warranted further study. The objective of this study was to determine how the Kidney-CAP modified decisions related to vascular access, dialysis modality, and kidney transplantation.
Methods: Cross-sectional surveys were administered at a Canadian academic nephrology program to health care providers (HCP) managing patients with chronic kidney disease (CKD), to patients on hemodialysis (CKD-HD), and to patients with CKD but not on dialysis (CKD-Clinic). A two-tailed, one-sample sign test was used to determine whether the median response to Likert-scale questions differed from the "no effect" response, at a p-value of < 0.05.
Results: Survey respondents included 18 HCP, 23 CKD-HD, and 30 CKD-Clinic patients. Having a Kidney-CAP increased CKD-Clinic patients' desire to undergo AVF or AVG creation (p=0.020). Having a Kidney-CAP had no impact on CKD-HD patients' desire to undergo AVF creation or to pursue hemodialysis at home, but increased their desire to undergo kidney transplantation (p=0.031). Availability of the Kidney-CAP had no impact on HCP recommendations related to AVF creation or kidney transplantation, but increased HCP recommendations for home hemodialysis in ESKD patients (p=0.039 for each).
Conclusion: This is the first study to assess the perceived benefit of a bleeding cessation device, with a focus on clinical decision-making related to vascular access, kidney transplantation, and dialysis modality. The Kidney-CAP is associated with increased patient interest in kidney transplantation and AVF creation. Further study is required to delineate patient decisions within demographic subgroups, such as previous kidney transplant or anticoagulation status.
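The two-tailed, one-sample sign test named in the Methods above can be sketched in a few lines. This sketch assumes a 5-point Likert scale with 3 as the "no effect" midpoint, a detail not given in the abstract:

```python
from math import comb

def sign_test_p(responses, null_value=3):
    """Two-tailed one-sample sign test against a 'no effect' midpoint.

    Responses tied with the null value are dropped; the p-value is the
    exact two-tailed binomial probability of a sign split at least this
    extreme under H0: P(above) = 0.5.
    (null_value=3 assumes a 5-point Likert scale; an assumption here.)
    """
    above = sum(1 for r in responses if r > null_value)
    below = sum(1 for r in responses if r < null_value)
    n = above + below
    k = min(above, below)
    # exact two-tailed p: 2 * P(X <= k) for X ~ Binomial(n, 0.5), capped at 1
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)
```

A median shifted toward "increased desire" shows up as mostly positive signs and a small p-value, mirroring the survey comparisons reported above.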
Convento, M. B.; Borges, F. T.
Introduction: Chronic kidney disease imposes a high clinical and economic burden on the Brazilian Unified Health System, and kidney transplantation offers the best prognosis.
Objective: To describe trends in living-donor (LD) kidney donation in Brazil (2010-2023), analyzing the donor-recipient relationship and the operational stock-to-annual-production ratio on the waiting list, and to compare hospital indicators and estimated patient and graft survival between LD and deceased-donor (DD) kidney transplants.
Methods: Descriptive ecological time-series study using aggregated, publicly available data.
Results: The waiting list increased by 15% (from 33,253 to 38,258), and the total number of transplants rose by 29% (from 4,656 to 6,047). The data showed an increase in deceased-donor transplants (from 3,001 to 5,189) and a decrease in LD transplants (from 1,655 to 858), with the LD share declining from 35.55% to 14.19% and the per-million-population rate falling from 8.8 to 4.2. Among LD transplants, there was a relative decrease in related donors (from 82.80% to 71.21%) and relative increases in unrelated spouse donors (from 10.57% to 18.65%) and other unrelated donors (from 6.63% to 10.14%). Comparatively, LD transplants showed better descriptive performance on survival indicators and lower in-hospital mortality, length of stay, and mean Hospital Admission Authorization value.
Conclusion: The findings indicate a need for strategies to sustain both DD procurement and LD donation.
Neze-Sebakunzi, J.; Doro Altan, A.-M.; Ceffa, S.; Guidotti, G.; Capparucci, S.; Ciccacci, F.; Musikingala, M.; Nkuba-Ndaye, A.; Makangara-Cigolo, J.-C.; Kabeya-Mampuela, T.; Orlando, S.; Ahuka-Mundeke, S.
Background: Cervical cancer is one of the most common cancers in women, particularly among women living with HIV (WLWH). Persistent infection with high-risk oncogenic human papillomavirus (Hr-HPV) is the primary etiological factor. However, Hr-HPV prevalence among WLWH in Kinshasa, Democratic Republic of the Congo, remains poorly documented. This study aimed to determine the prevalence of Hr-HPV infection and identify associated risk factors in this population.
Methods: A cross-sectional study was conducted on WLWH aged 25 to 65 years receiving antiretroviral therapy at the DREAM Centre in Kinshasa. Cervical samples were collected and analyzed using multiplex PCR for detection of Hr-HPV genotypes. Sociodemographic data and risk factors were collected via questionnaires, and associations with Hr-HPV infection were assessed using multivariate logistic regression.
Results: A total of 436 women were included. The prevalence of Hr-HPV infection was 47.25%. HPV types 16 and 18 (alone or in co-infection) were detected in 23.79% of participants. In multivariate logistic regression analysis, WHO clinical stage 3-4 (aOR 1.75; 95% CI 1.16-2.64; p=0.008), HIV viral load ≥1000 copies/mL (aOR 3.08; 95% CI 1.28-7.42; p=0.012), and antiretroviral therapy duration <2 years (aOR 0.52; 95% CI 0.29-0.93; p=0.028) were significantly associated with Hr-HPV infection.
Conclusions: Nearly one in two WLWH in Kinshasa was infected with Hr-HPV, and one in four carried HPV-16/18 genotypes. Advanced HIV disease and uncontrolled viral replication were strongly associated with Hr-HPV infection. These findings underscore the urgent need to integrate systematic Hr-HPV screening into HIV care programs, particularly for women with advanced clinical stage or persistent viremia.
Koyra, A. B.; Mohammed, F.; Eshete, T.
Background: Family-based HIV index case testing identifies family members with unknown HIV status and links them to care. Data are limited in southern Ethiopia.
Methods: A facility-based cross-sectional study was conducted among 377 adults on antiretroviral therapy (ART) in Wolaita Zone, Southern Ethiopia, from November 2022 to May 2023. Participants were selected using systematic random sampling. Data were collected via an interviewer-administered, semi-structured questionnaire. Multivariable logistic regression identified factors associated with index case family testing. Adjusted odds ratios (AOR) with 95% confidence intervals (CI) were calculated, and statistical significance was declared at p < 0.05.
Results: The proportion of index case family testing for HIV was 84.9% (95% CI: 81.2-88.6). In multivariable analysis, urban residence (AOR = 2.8; 95% CI: 1.16-6.75), duration on ART greater than 12 months (AOR = 13.0; 95% CI: 4.6-36.9), disclosure of HIV status to family members (AOR = 5.6; 95% CI: 1.9-16.5), discussion of HIV status with family members (AOR = 6.6; 95% CI: 1.9-23.2), and being counselled by health professionals to bring family members for testing (AOR = 6.3; 95% CI: 2.1-19.0) were significantly associated with index case family testing.
Conclusion: The prevalence of family-based HIV index case testing in Wolaita Zone was 84.9%, below the national 95% target. Health professionals should strengthen counselling on ART adherence, status disclosure, family discussion, and active referral to improve testing uptake among family members of people living with HIV.
Lescano, J. I. O.; Belangoy, K. P.; Nishimura, Y.; Harada, K.; Hagiya, H.; Vu, Q.; Ouddoud, H.; See, G. L. L.; Arce, F. V.; Tan, E. Y.; Iwata, N.; Takeda, T.; Zamami, Y.; Koyama, T.
Background: Stroke is a leading cause of mortality and disability globally. However, information about stroke burden in the Philippines is limited. We sought to analyze stroke burden in the Philippines from 1990 to 2023.
Methods: Incidence, prevalence, mortality, and disability-adjusted life-years (DALYs) estimates from the Global Burden of Disease Study 2023 were used as indicators to analyze the burden of stroke by sex and age. Temporal trends in both crude and age-standardized rates were analyzed using joinpoint regression analysis.
Results: In 2023, stroke incidence was estimated at 156.2 (95% uncertainty interval [UI]: 140.8-175.4) thousand, prevalence at 1.2 (95% UI: 1.2-1.4) million, mortality at 72.2 (95% UI: 63.2-83.0) thousand, and DALYs at 2.1 (95% UI: 1.8-2.3) million. High systolic blood pressure was the leading contributor to risk-attributable stroke mortality and DALYs. Since 1990, age-standardized rates declined significantly, whereas crude rates increased markedly. Compared with women, men had a higher fatal burden and consistently exhibited a higher age-standardized burden. Although older adults (≥55 years) had the highest stroke burden and achieved reductions in stroke incidence and fatal outcomes, both fatal and non-fatal burdens consistently increased among young adults (35-54 years).
Conclusion: While age-standardized rates have improved, the rising crude burden and the shift toward younger adults present significant public health challenges. These trends highlight the pressing need for aggressive, targeted risk factor control, sustained risk monitoring, and strengthened acute and post-stroke care to mitigate the growing health burden of stroke in the Philippines.
Sada, K.-e.; Yamazaki, H.; Wakita, T.; Yamamoto, Y.; Wang, J.; Onishi, Y.; Hamada, T.; Ide, R.; Takeda, M.; Fukuhara, S.; Shibagaki, Y.
Background: Hyperkalemia is common in chronic kidney disease (CKD) and chronic heart failure (CHF), often leading to treatment dilemmas regarding renin-angiotensin-aldosterone system (RAAS) inhibitors. Although potassium binders and dietary restrictions are central to chronic management, their quality-of-life (QOL) impact remains insufficiently described. This study aimed to characterize real-world treatment patterns and evaluate treatment impact on QOL.
Methods: We analyzed baseline data from a prospective cohort in Japanese nephrology and cardiology outpatient clinics. Participants were adults with CKD (stage ≥G3) or CHF (New York Heart Association class II-IV) who had initiated potassium binders within 6 months. Clinical data, serum potassium values, and patient-reported outcomes (generic QOL, disease/treatment-specific QOL, and adherence measures) were obtained at enrollment.
Results: Among 347 patients, the median age was 75 years, and 74% were male; 93% had CKD. At enrollment, 300 patients were receiving potassium binders, and 59% were prescribed a RAAS inhibitor. Dietary therapy was implemented in 29%. Physical scores of generic QOL were lower than population norms, whereas mental scores were comparable. Treatment-specific QOL scores indicated that potassium binders had a smaller impact on QOL than dietary therapy. Adherence to potassium binders was high.
Conclusions: Concurrent use of RAAS inhibitors and potassium binders was common, suggesting that binders may support RAAS inhibitor continuation. Potassium binders had less perceived impact than dietary restrictions, indicating that pharmacologic potassium control may be acceptable to patients managing multiple lifestyle limitations. These findings highlight the role of potassium binders in maintaining both RAAS inhibitor therapy and QOL.
Mazumder, A.; Pintea, S. D.; Chen, L.; Mazumder, A.; Kopp, J. B.
Chronic kidney disease of unknown etiology (CKDu) has emerged as an important public health challenge, particularly in agricultural communities across Southern Asia and Central America. Our research aims to explore the role of environmental factors in contributing to CKDu prevalence in these regions. Using an Extreme Gradient Boosting (XGBoost) machine learning model, we analyzed an environmental dataset from the CKDu-endemic region of Sri Lanka. The XGBoost model achieved 85% accuracy in predicting CKDu prevalence across a total of 100 locales. Significant predictor variables included fluoride concentration in water, electrical conductivity (EC) of drinking water, pH, and soil type. Fluoride, a common contaminant in drinking water, was the most influential factor, followed by EC and pH, which influence the solubility and bioavailability of nephrotoxic chemicals in water sources. The study findings highlight the urgent need for targeted water analysis programs and interventions in water quality management, agrochemical usage, and soil treatment in CKDu-endemic regions. These insights also provide a framework for future research to identify causative agents and develop strategies for reducing CKDu prevalence.
Padhi, A.; Bera, J. H.; Rajyaguru, B.; Chauhan, J.; Rank, D.; Modasiya, I.; Bhalani, S.; Agarwal, A.
Background: Dengue virus infection remains a significant public health concern in India, with changing serotype dynamics influencing disease epidemiology. Understanding local serotype distribution and clinical characteristics is crucial for effective disease management and surveillance.
Objectives: To determine the prevalence of dengue virus serotypes and analyze their clinical characteristics among NS1-positive patients at a tertiary-care hospital in Gujarat, India.
Methods: A cross-sectional study was conducted on NS1-positive dengue patients admitted to AIIMS Rajkot from September 2023 to November 2024. Real-time reverse transcription polymerase chain reaction (RT-PCR) was performed for serotype identification. Clinical and demographic data were collected and analyzed.
Results: Seventy NS1-positive patients were confirmed by RT-PCR. DENV-2 was the predominant serotype (53 cases, 75.7%), followed by DENV-1 and DENV-3 (7 cases each, 10.0%) and DENV-4 (2 cases, 2.9%). One co-infection case (DENV-2 + DENV-3) (1.4%) was identified. The mean age was 27.7 ± 14.4 years, with male predominance (58.6%). Young adults (19-35 years) were most affected (45.7%), followed by pediatric patients ≤18 years (32.9%). Severe dengue occurred in only one case (1.4%), while hospitalization was required in 25 cases (35.7%). All patients presented with fever; chills, headache (50%), rashes (56%), and malaise (56%) were the most common associated symptoms.
Conclusions: DENV-2 showed clear predominance in the Rajkot region during the study period, with low rates of severe disease. The significant pediatric and young adult involvement highlights the need for targeted prevention strategies. These findings contribute to the understanding of regional dengue epidemiology and support evidence-based surveillance and control measures.
Bot, M.; Penninx, B. W.
Background: Worldwide, common mental disorders such as anxiety and depression are major contributors to disability. However, the role of diet as a risk factor for anxiety and depression remains underexplored. We therefore investigated the associations between food groups and major depressive disorder (MDD) and anxiety disorders, following a harmonized protocol to enable integration of studies.
Methods: We analysed data from 1,634 participants in the Netherlands Study of Depression and Anxiety to examine cross-sectional associations between 14 dietary exposures derived from a 238-item Food Frequency Questionnaire (fruit, vegetables, legumes, whole grains, nuts and seeds, milk, red meat, processed meat, sweet drinks, fibre, calcium, omega-3 fatty acids, polyunsaturated fatty acids, and trans fats) and anxiety and depressive disorders in the past month (assessed with the Composite International Diagnostic Interview). Secondary outcomes were depressive symptoms (Quick Inventory of Depressive Symptomatology score ≥13 vs. <13) and anxiety symptoms (Beck Anxiety Index score ≥16 vs. <16). Logistic regression analyses were conducted for each dietary exposure, with depression and anxiety measures as outcomes.
Results: In the past month, 8.7% of participants had MDD and 14.4% had an anxiety disorder. Higher vegetable intake was associated with lower odds of depression and anxiety disorders. Additionally, higher intakes of omega-3 fatty acids, red meat, whole grains, and fibre were associated with lower odds of depression and anxiety, whereas higher intake of trans fats was associated with increased odds of these disorders. Other dietary exposures were not significantly related to depression or anxiety.
Discussion: Certain dietary exposures, particularly vegetables, as well as omega-3 fatty acids, red meat, whole grains, and fibre, were associated with depression and anxiety outcomes. These findings may contribute to the integration of results in Global Burden of Disease initiatives exploring dietary risk factors for depression and anxiety.
Nguyen, D.; Tate, C.; Akaraci, S.; Wang, R.; Kee, F.; Mullineaux, S.; ONeill, C.; Cleland, C.; Murtagh, B.; Ellis, G.; Bryan, D.; Longo, A.; Garcia, L.; Clarke, M.; Hunter, R. F.
Background: Evidence on the long-term impact of urban green and blue space (UGBS) interventions remains limited. This study is a 15-year evaluation of an urban greenway development in Belfast (United Kingdom), assessing the potential effects of this UGBS intervention on physical activity (PA), mental wellbeing, and co-benefits.
Methods: Using a quasi-experimental design, a repeated cross-sectional survey was conducted in 2010 (baseline), 2017 (post-opening), and 2023 (long-term follow-up), with about 1,200 adults participating in each wave. Outcomes included PA, mental wellbeing, general health, quality of life, social capital, and environmental perception. Multilevel mixed-effects regressions were performed to examine within-group changes at long-term follow-up. Difference-in-differences analysis investigated the between-group changes that might be attributed to the greenway. Additional comparative analyses included distance-decay analysis and comparison with population trends in Northern Ireland.
Results: Six years after completion, the greenway intervention appears to buffer a decline in duration of PA, mainly from moderate-intensity activity (decline lower by 118.6 min/week, 95% CI: 3.9-232.2), but with no significant impact on the proportion of the population meeting the recommended PA level. The intervention is associated with a smaller decline in self-rated health (4.98 units; 95% CI: 0.62-9.34) relative to the control group. The intervention's association with mental wellbeing was positive but not significant (p=0.30). The greenway also showed positive effects on social capital and environmental perceptions, with impacts most evident in improved safety and trust in the local area.
Conclusion: This study provides evidence supporting the public health impact of UGBS and its long-term health and social benefits.
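The difference-in-differences estimator named in the Methods above reduces, for simple group means, to one line: the change in the intervention group minus the change in the control group. A minimal sketch with invented numbers (the study's actual models were multilevel regressions, so this is illustrative only):

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences on group means: the change in the
    intervention group minus the change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# toy minutes/week of physical activity: both groups decline, but the
# intervention group declines less, so the DiD estimate is positive,
# mirroring the "buffered decline" pattern reported above
effect = did_estimate(200.0, 180.0, 210.0, 150.0)  # +40.0
```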
Caplin, B.; Agarwal, S.; Day, A.; Al-Rashed, A.; Oomatia, A.; Gonzalez-Quiroz, M.; Pearce, N.
Introduction: There remains considerable debate as to the cause of the epidemic of Mesoamerican Nephropathy (MeN). We have previously reported early loss of estimated glomerular filtration rate (eGFR) as a surrogate for disease onset in a population-representative cohort study of young adults at risk of disease from Northwest Nicaragua. Using a nested case-control approach, we analysed urine and serum proteins surrounding this timepoint with the aim of gaining insight into the primary disease aetiology.
Methods: We conducted label-free ultra-high-performance liquid chromatography mass-spectrometry-based proteomics using urine samples collected at the study visit before, and at, first observed eGFR loss among cases, and compared results to matched controls. We then performed direct protein measurements in a discovery cohort, followed by quantification of serum total immunoglobulin E (stIgE) at multiple timepoints in a replication cohort.
Results: Proteomic analysis demonstrated no differences in the levels of any single protein between cases and controls (n=25 each), at either timepoint, after correction for multiple comparisons. However, functional enrichment analysis demonstrated upregulation of adaptive immune pathways among cases. Direct measurements in the discovery cohort using a high-sensitivity PCR-based immunoassay (n=21 controls, 19 cases) demonstrated higher stIgE in cases at the study visit immediately prior to first observed eGFR loss (mean difference 810 kU/L, 95% confidence interval (CI): 162-1457 kU/L). In the replication cohort (n=22 cases, 21 controls), an stIgE level >500 kU/L measured by electrochemiluminescence in study samples from any timepoint in the 3 years prior to the first observed loss of eGFR was independently associated with case status when compared to samples from controls at matched visits (adjusted odds ratio: 8.1, 95% CI: 1.4-47.8).
Discussion: A high level of stIgE precedes loss of eGFR in those at risk of MeN. Understanding what leads to this rise is likely to be key to understanding the cause of the MeN epidemic.
Lay Summary: Mesoamerican nephropathy describes an epidemic-level chronic kidney disease impacting rural working-age adults in Central America. Although a number of exposures, including occupational heat exposure, have been proposed as the cause of the epidemic, there remains much debate as to the primary aetiology of the disease. In this study, we interrogated urine and blood samples from individuals from affected communities at risk of disease, both before and after they developed kidney dysfunction. Using two different approaches, analysis of both urine and blood samples provides evidence of upregulation of immunoglobulin E (IgE)-related pathways in the 2-3 years before individuals develop evidence of kidney disease. Infections (particularly those involving parasites) and allergic reactions, but not heat exposure, have been reported to increase IgE levels. Going forward, understanding the cause of this increase in IgE in individuals at risk of disease is likely to provide insight into the cause of Mesoamerican nephropathy epidemics.
Biswas, R. S. R.; Moharar, T.; Karim, M. R.; Hasan, M. M.; Biswas, S. K.
Introduction: Dengue has occurred regularly in Bangladesh, including Chattogram, for the last 6-7 years and is showing shifts in serotype dominance. The objective of the present study was therefore to explore the burden of dengue serotypes in Chattogram.
Methods: In this study, 223 dengue RT-PCR-positive patients were evaluated for serotyping. Sex and age group, along with cycle threshold (Ct) values, were also collected. After collection, data were compiled, analyzed, and plotted in Microsoft Excel and GraphPad Prism 10.4. Ethical clearance was obtained to conduct the study.
Results: Among the 223 patients analyzed, males and females were nearly equal in number (113 and 110). Middle-aged patients outnumbered those at the extremes of age; the mean ± SD age was 33.55 ± 13.67 years. Regarding serotype distribution, isolated DENV-1, DENV-2, and DENV-3 were found in 1.3%, 73.1%, and 6.7% of cases, respectively. Concurrent infections with multiple serotypes were observed in several patients, most notably the DENV-2 and DENV-3 combination, which accounted for 14.3% (n=32) of the cases. Other co-infections were less frequent: the DENV-1 and DENV-2 pairing appeared in 3.6% (n=8) of the cohort, while triple-serotype infections (DENV-1, 2, and 3) and DENV-3/DENV-4 pairings were rare, each occurring in only 0.4% of patients. Statistical analysis of Ct values revealed no significant sex-based differences for DENV-2 and DENV-3. However, significant variations in Ct values were observed when comparing DENV-1 against both DENV-2 and DENV-3 (p < 0.05). In contrast, the difference between DENV-2 and DENV-3 Ct values remained statistically insignificant.
Conclusion: In 2025, dengue serotypes 2 and 3 were the most prevalent, both in isolation and in combination, while DENV-1 and DENV-4 were least common. Exposure to multiple serotypes and shifts from one serotype to another might influence dengue outcomes in the future, which needs further exploration.
Casagrande, B. P.; Beserra, V. R.; Pisani, L. P.; Ribeiro, A. M.; Estadella, D.
Background: Obesogenic diets (ODs) are known to trigger metabolic and inflammatory disturbances. However, the effects of short-term OD withdrawal on systemic and neuroinflammatory parameters remain unclear.
Objectives: This study investigated the short-term effects of OD withdrawal on metabolic, inflammatory, and anxiety-like outcomes in young male Wistar rats.
Methods: Three-week-old male Wistar rats were fed either a control (Ct, n=5) or high-sugar/high-fat (HSHF) diet for 14 days. Animals in the HSHF group were further divided into no-withdrawal (NWt, n=5) and withdrawal (Wt, n=5) groups, where Wt received a control diet for 48 hours. Food intake, body mass, adiposity, serum metabolic parameters, hepatic energy stores, inflammatory markers (serum, liver, hypothalamus, hippocampus, mesenteric fat), and oxidative stress markers in the hippocampus were measured. Anxiety-like behaviour was assessed using the elevated plus maze.
Results: OD intake significantly increased caloric intake, visceral adiposity, hepatic glycogen, and TAG levels. The 48-hour withdrawal reduced TAG, induced hyperinsulinemia and hypoglycaemia, and heightened inflammation in mesenteric fat, serum, and the hippocampus. Oxidative stress markers (SOD and MDA) increased in the hippocampus, correlating with elevated serum corticosterone and heightened anxiety-like behaviour in the Wt group compared to the other groups.
Conclusion: Short-term withdrawal after only two weeks of OD intake exacerbates systemic inflammation, neuroinflammation, hippocampal oxidative stress, and anxiety-like behaviours, indicating rapid negative responses to dietary transition. These findings underscore the metabolic and behavioural challenges associated with short-term OD withdrawal and the need for adjunct interventions to mitigate its adverse effects.
Rashid, J. S.; Chacha, S.; Ghaimo, F. E.; Mzilangwe, E. S.; Morawej, Z.; Mhina, C.; Kuganda, S.
Background: Glaucoma is one of the leading causes of blindness worldwide. Its chronic nature and the potential for irreversible vision loss contribute to significant distress among affected individuals. An estimated 25% of individuals with glaucoma experience depression, which negatively impacts their quality of life and treatment adherence. However, data on the prevalence of depression among people with glaucoma in Tanzania are limited. This study aimed to determine the prevalence of, and factors associated with, depressive symptoms among adults with glaucoma at Muhimbili National Hospital. Materials and methods: A cross-sectional study was conducted involving 297 adults with glaucoma, recruited consecutively from the ophthalmology clinic at Muhimbili National Hospital between July and November 2024. Data on biopsychosocial factors were collected using interviewer-administered questionnaires and medical records. The Patient Health Questionnaire-9 and the Oslo Social Support Scale assessed depressive symptoms and social support, respectively. Data were analyzed using STATA version 16. Logistic regression analyses identified factors associated with probable depression, with statistical significance set at p < 0.05. Results: The mean age of participants was 63.6 years (SD 12.8), and 159 (53.5%) were female. The prevalence of probable depression was 11.1%, with 8.7% reporting moderate and 2.4% moderately severe symptoms; none reported severe depressive symptoms. Moderate social support (AOR 0.14; CI 0.04-0.47; p=0.001) and strong social support (AOR 0.08; CI 0.03-0.25; p<0.001) were significantly associated with lower odds of probable depression. Conclusion: Approximately 1 in 10 individuals with glaucoma experience depression. Good social support was identified as a protective factor against depression in people with glaucoma. These findings underscore the need for a multidisciplinary approach integrating psychosocial services into ophthalmology clinics.
Hanif, A. A. M.; Goyal, P.; Colantonio, L. D.; Safford, M. M.; Enogela, E. M.; Reid, R.-J.; Fasokun, M. E.; Akinyelure, O. P.; Bowling, C. B.; Quezada-Pinedo, H.; Sterling, M. R.; Levitan, E. B.
Background: Poor physical performance, measured by gait speed and chair stands, is associated with mortality; these associations may differ by history of cardiovascular disease (CVD). Methods: Among 14,137 REasons for Geographic And Racial Differences in Stroke (REGARDS) study participants, gait speed and chair stand times (2013-2016) were categorized into quartiles plus a fifth category for those who were unable to complete the test. Associations with adjudicated CVD and all-cause mortality through 2020 were examined among participants with and without a history of CVD. Results: Average age was 72.5 ± 8.5 years. Among participants without a history of CVD, those in the slowest vs. fastest gait speed quartile had HRs of 2.01 (95% CI 1.18-3.43) for CVD and 1.66 (1.33-2.07) for all-cause mortality; among those unable to complete the test, HRs were 2.37 (1.12-5.03) for CVD and 2.33 (1.72-3.17) for all-cause mortality. Among participants with a history of CVD, the slowest gait speed quartile had HRs of 1.28 (0.96-1.72) for CVD and 1.72 (1.45-2.04) for all-cause mortality; HRs among those unable to complete the test were 1.87 (1.29-2.70) for CVD and 2.74 (2.22-3.38) for all-cause mortality (p-interaction between with and without history of CVD <0.05). Inability to complete the chair stand test was associated with higher mortality in both groups. Conclusions: Poor physical performance was associated with greater CVD-related and all-cause mortality among individuals both with and without a history of CVD, with the highest risks observed among those who were unable to complete the assessments.
Holford, T. R.; Tam, J.; Jeon, J.; Mok, Y.; Meza, R.
Introduction: Mortality and smoking rates vary over time across the US. The Cancer Intervention and Surveillance Modeling Network Lung Working Group (CISNET-LWG) has developed a smoking history generator to describe the effects of smoking on health. This work further refines these parameters and quantifies effects on life expectancy. Methods: Data from the National Health Interview Survey (NHIS) and the Tobacco Use Supplement to the Current Population Survey (TUS-CPS) were used to estimate smoking history parameters for each state. An age-period-cohort model was used in most cases, but an age-cohort model was used for cessation probabilities. Population mortality data were used to estimate mortality rates for all causes, lung cancer, and non-lung cancer, and these were partitioned by smoking status. Results: California and Kentucky exemplify states with more and less aggressive tobacco control, respectively. The difference between population cohort life expectancy and the life expectancy of never smokers was greater for males than for females, and greater in Kentucky than in California because of higher smoking rates. These differences decreased with time. Similar results are shown for each state. Conclusions: Smoking parameters and mortality trends vary considerably among states, reflecting variation in exposure to tobacco smoking and its effects on life expectancy. The Southeast region tends to show greater differences from never smokers because of higher smoking rates. However, other factors also affect mortality rates.
Melville, S.; MacKinnon, M.; Michaud, J.
Background: Life-sustaining hemodialysis (HD) is onerous for patients, especially those with multiple comorbidities and advanced age. A standard HD prescription is 720 minutes per week. Alternative HD regimens have been proposed in an attempt to maintain quality of life (QOL). Studies are needed to investigate the efficacy and safety of less frequent HD prescriptions in this population. This is an institution-wide observational study in New Brunswick, Canada, comparing HD prescriptions and their impact on QOL and mortality. Objective: The purpose of this study is to assess current HD prescribing practices at a provincial healthcare institution in relation to patient QOL. Design: Prospective observational study. Setting: Single-centre hospital and satellite hemodialysis units. Patients: Voluntarily consented patients undergoing in-centre hemodialysis treatment. Measurements: Observational clinical data were collected for each study participant from their hospital and dialysis electronic medical records. The KDQOL-36™ questionnaire was used to assess patient-reported quality of life at the time of consent. Methods: Adults undergoing in-centre or satellite-site HD for at least 3 months were eligible to participate. Consenting participants were grouped by HD prescription: 720 minutes or more per week versus less than 720 minutes per week. All participants completed the KDQOL-36™ questionnaire to estimate QOL, and groups were compared using the Mann-Whitney U test. Emergency department visits, hospitalizations, and mortality were analyzed using negative binomial or logistic regression. Results: We enrolled 140 participants; 41 were undergoing less than 720 minutes per week of HD and 99 were undergoing 720 minutes or more per week. Patients undergoing less than 720 minutes per week of HD were older [median (IQR): 76 (72-81) yrs vs. 64 (55-75) yrs; p < 0.001], had higher median (IQR) QOL scores on the Symptoms/Problems List scale of the KDQOL-36™ questionnaire [79.2 (70.8-88.5) vs. 70.8 (62.5-81.3); p = 0.0022], and were less likely to present to the emergency department (incidence rate ratio 0.52, 95% confidence interval [CI] 0.33-0.81). Mortality was similar between groups, even when adjusted for age and comorbidity score (odds ratio 1.62, 95% CI 0.59-4.49). Limitations: Participant enrollment was limited by the single-centre nature of this study. As this was an observational study, we did not account for how long patients had been prescribed less than 720 minutes of hemodialysis. We did not include a frailty assessment of the study participants. A larger number of study participants might have identified significant trends in mortality. Conclusions: Patients undergoing less than 720 minutes of weekly HD had a higher QOL score on the KDQOL-36™ Symptoms/Problems List scale, presented less frequently to the emergency department, and were not more likely to die than patients undergoing 720 minutes or more of weekly HD. Further studies are required to assess the feasibility and safety of a conservative model of HD prescribing to improve QOL for patients with palliative care treatment goals.
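The hemodialysis study above compares KDQOL-36 scores between prescription groups with the Mann-Whitney U test. As a minimal sketch of how that statistic is computed (the sample values below are hypothetical illustrations, not the study's data):

```python
# Minimal Mann-Whitney U statistic (rank-sum form). A full test would
# add a p-value via the normal approximation or an exact distribution.

def mann_whitney_u(a, b):
    """Return the Mann-Whitney U statistic for two samples."""
    pooled = sorted(list(a) + list(b))
    # Average rank for each value; tied values share the mean of their
    # 1-based rank range.
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + j + 1) / 2  # mean of ranks i+1 .. j
        i = j
    n1, n2 = len(a), len(b)
    r1 = sum(rank[v] for v in a)          # rank sum of the first sample
    u1 = r1 - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)          # conventional two-sided U

# Hypothetical QOL-style scores for two small groups:
print(mann_whitney_u([79, 82, 88, 71], [70, 62, 81, 75]))  # -> 3.0
```

With no overlap between groups the statistic reaches its minimum of 0; values near n1*n2/2 indicate heavily overlapping distributions.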